feat: Improves logger to remove duplication #21

Merged: Chocksy merged 3 commits into main from feat/improved_logger on Mar 21, 2025

Conversation


@Chocksy (Owner) commented Jul 9, 2024

Improves the Logger: it checks whether a set of logs has already been printed and only prints them if not, reducing the volume of logs we consume.
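The de-duplication idea described above can be sketched roughly as follows. This is a minimal illustration of the approach, not the actual Tools/Logger.py implementation; the class and method names here are hypothetical:

```python
import hashlib

class DedupLogger:
    """Print each distinct message once; count repeats instead of re-printing."""

    def __init__(self):
        self.seen = {}  # message hash -> repeat count

    def log(self, message: str) -> bool:
        """Return True if the message was actually printed."""
        key = hashlib.sha1(message.encode()).hexdigest()
        if key in self.seen:
            self.seen[key] += 1  # suppress the duplicate, just count it
            return False
        self.seen[key] = 1
        print(message)
        return True

logger = DedupLogger()
logger.log("order filled")  # printed
logger.log("order filled")  # suppressed
```

The counts kept in `seen` also make it cheap to report how many lines were suppressed at the end of a run.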

Summary by CodeRabbit

  • New Features
    • Advanced logging capabilities now group messages for detailed trend, statistical, and distribution analysis.
    • Introduction of new classes for encapsulating log details and managing related messages.
  • Enhancements
    • Daily log processing is streamlined to consolidate and output logs efficiently during automated workflows.
    • Setup instructions in the README have been updated for clarity and to include new dependency management steps.
    • Transition to PDM for dependency management and testing, replacing previous tools.

These updates deliver more structured log insights and smoother end-of-cycle operations for improved system observability.

@Chocksy self-assigned this Jul 9, 2024
@Chocksy marked this pull request as draft September 9, 2024 12:50
@Chocksy force-pushed the feat/improved_logger branch from 89a5ed1 to acbe735 on March 21, 2025 17:01

coderabbitai bot commented Mar 21, 2025

Caution: Review failed. The pull request is closed.

Walkthrough

The changes enhance the logging system in the application. In Tools/Logger.py, new classes (LogMessage and MessageGroup) encapsulate log details and allow grouping for statistical analysis. The Logger class is updated to initialize its storage, refactor the logging method, and add daily log processing. Additionally, convenience logging methods are streamlined. In main.py, the end-of-day and end-of-algorithm methods now include debug logging and invoke the daily log processing or position storage depending on the live mode. The project also transitions from pipenv to PDM for dependency management.

Changes

File(s) Change Summary
Tools/Logger.py • Added LogMessage and MessageGroup classes to encapsulate log details and group messages
• Updated Logger to initialize logging storage, refactored Log for message grouping with error handling, and introduced process_and_output_daily_logs for daily summaries
• Simplified convenience methods (error, warning, info, debug, trace) and updated dataframe implementation
main.py • Updated CentralAlgorithm.OnEndOfDay to include debug logging for symbols and process daily logs
• Modified CentralAlgorithm.OnEndOfAlgorithm to conditionally store positions in live mode or process daily logs otherwise
.github/workflows/tests.yml • Transitioned from pipenv to PDM for dependency management and testing commands
.pdm-python • Created new file specifying the path to the Python interpreter
Pipfile • Removed file containing project dependencies and configuration
README.md • Enhanced setup instructions to include PDM installation steps
pyproject.toml • Introduced new file with project metadata, dependencies, and PDM configuration
requirements.txt • Removed file listing project dependencies
run_tests.sh • Replaced pipenv commands with pdm commands for test execution
Tests/specs/tools/logger_spec.py • Added new test suite for Logger, testing LogMessage, MessageGroup, and Logger classes with custom matchers

Sequence Diagram(s)

sequenceDiagram
    participant App as Application
    participant Logger as Logger
    participant LG as LogMessage
    participant MG as MessageGroup

    App->>Logger: Log(message, threshold)
    Logger->>LG: Create LogMessage instance
    Logger->>Logger: Determine if message has numeric value
    alt Numeric value present
        Logger->>MG: Add LogMessage to MessageGroup
    else
        Logger->>Logger: Store message in daily log dictionary
    end
    Logger->>Logger: Call _log_immediate for immediate processing
sequenceDiagram
    participant CA as CentralAlgorithm
    participant Logger as Logger
    participant PS as PositionsStore

    CA->>Logger: OnEndOfDay() invoked
    Logger-->>CA: Debug log for symbol
    CA->>Logger: process_and_output_daily_logs()

    CA->>CA: OnEndOfAlgorithm() invoked
    alt Live Mode
        CA->>PS: store_positions()
    else
        CA->>Logger: process_and_output_daily_logs()
    end

Poem

Hi-ho, I'm a bunny with cheer,
Logging messages far and near.
New classes and flows make my day,
Sorting logs in a clever way.
Hopping through code with a joyful heart,
Changes so neat—a work of art!
🐇🥕 Happy logging from the burrow smart!


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a6c0a1b and 4e95730.

⛔ Files ignored due to path filters (2)
  • Pipfile.lock is excluded by !**/*.lock
  • pdm.lock is excluded by !**/*.lock
📒 Files selected for processing (10)
  • .github/workflows/tests.yml (2 hunks)
  • .pdm-python (1 hunks)
  • Pipfile (0 hunks)
  • README.md (1 hunks)
  • Tests/specs/tools/logger_spec.py (1 hunks)
  • Tools/Logger.py (1 hunks)
  • main.py (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements.txt (0 hunks)
  • run_tests.sh (1 hunks)
@Chocksy marked this pull request as ready for review on March 21, 2025 17:01
coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (5)
main.py (1)

118-123: Add docstring or comment clarifying log usage
These lines correctly add a debug message and call process_and_output_daily_logs(), ensuring daily logs are summarized. Consider adding a concise docstring or comment explaining how the logger handles daily vs. live mode scenarios to assist future maintainers.

Tools/Logger.py (4)

6-9: Clean up unused imports
deque, OrderedDict, datetime, timedelta, and json are not used in this file. Removing them can reduce clutter.

-from collections import deque, defaultdict, OrderedDict
+from collections import defaultdict
-from datetime import datetime, timedelta
+from datetime import datetime
-import json
🧰 Tools
🪛 Ruff (0.8.2)
6-6: collections.deque imported but unused (F401)
6-6: collections.OrderedDict imported but unused (F401)
7-7: datetime.datetime imported but unused (F401)
7-7: datetime.timedelta imported but unused (F401)
9-9: json imported but unused (F401)


43-179: Robust grouping and analysis in MessageGroup
This design efficiently handles numeric analysis (trend, distribution) and textual grouping. It may be beneficial to handle extremely large message sets with caution, but for typical use cases, this is a clean approach.
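The kind of numeric analysis this comment refers to could look like the sketch below. The names and exact statistics are assumptions for illustration; the real MessageGroup in Tools/Logger.py may compute different measures:

```python
from statistics import mean, stdev

class MessageGroup:
    """Collects numeric values from related log messages for summary stats."""

    def __init__(self, template: str):
        self.template = template
        self.values: list[float] = []

    def add(self, value: float) -> None:
        self.values.append(value)

    def summary(self) -> dict:
        """One summary line instead of N raw log lines."""
        vals = self.values
        return {
            "count": len(vals),
            "min": min(vals),
            "max": max(vals),
            "mean": mean(vals),
            "stdev": stdev(vals) if len(vals) > 1 else 0.0,
            # a crude trend: sign of last value minus first value
            "trend": "up" if vals[-1] > vals[0] else "down" if vals[-1] < vals[0] else "flat",
        }

g = MessageGroup("price checked: {}")
for v in [10.0, 12.0, 11.0, 15.0]:
    g.add(v)
print(g.summary())
```

For "extremely large message sets", the mean and extrema could be maintained incrementally instead of storing every value, trading the distribution detail for bounded memory.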


193-224: Thoughtful logging logic
Implementing immediate logs for live mode vs. grouping for backtests ensures minimal overhead in real-time trading. As an enhancement, consider explicitly logging the reason why grouping is skipped in live mode for extra clarity.

🧰 Tools
🪛 Ruff (0.8.2)
220-220: Do not use bare except (E722)
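Ruff's E722 finding is typically resolved by catching `Exception` or, better, the specific exceptions the operation can raise, so that `KeyboardInterrupt` and `SystemExit` still propagate. A small sketch of the pattern (the surrounding helper is hypothetical, not the code at line 220):

```python
def extract_numeric(message: str):
    """Try to pull a trailing numeric value out of a log message."""
    try:
        return float(message.rsplit(":", 1)[1])
    except (IndexError, ValueError):
        # Narrow except instead of a bare `except:`; a bare except would
        # also swallow KeyboardInterrupt and SystemExit.
        return None

print(extract_numeric("fill price: 12.5"))  # 12.5
print(extract_numeric("no number here"))    # None
```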


281-297: Destructor usage
Calling process_and_output_daily_logs() in __del__ helps in final cleanup. Be mindful that destructors might not always be called if references persist. Testing on the final run ensures no data is lost.
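Because `__del__` is not guaranteed to run (lingering references, interpreter shutdown ordering), a common alternative is registering the flush with `atexit`. A sketch under that assumption, reusing the `process_and_output_daily_logs` name from this PR but with an otherwise hypothetical Logger:

```python
import atexit

class Logger:
    def __init__(self):
        self.pending: list[str] = []
        # atexit handlers run at normal interpreter shutdown,
        # even if __del__ never fires for this instance.
        atexit.register(self.process_and_output_daily_logs)

    def log(self, message: str) -> None:
        self.pending.append(message)

    def process_and_output_daily_logs(self) -> None:
        for message in self.pending:
            print(message)
        self.pending.clear()
```

Note that `atexit` will not run on `os._exit()` or a hard crash, so it complements, rather than replaces, an explicit call at end of algorithm.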

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8dfbfe3 and acbe735.

📒 Files selected for processing (2)
  • Tools/Logger.py (1 hunks)
  • main.py (2 hunks)
🧰 Additional context used
🪛 Ruff (0.8.2): Tools/Logger.py
6-6: collections.deque imported but unused (F401)
6-6: collections.OrderedDict imported but unused (F401)
7-7: datetime.datetime imported but unused (F401)
7-7: datetime.timedelta imported but unused (F401)
9-9: json imported but unused (F401)
220-220: Do not use bare except (E722)

🔇 Additional comments (5)
main.py (1)

147-153: Ensure intended end-of-algorithm log handling
Separating the logic for live vs. backtest mode with a position store vs. daily log output is clear. Verify that calling process_and_output_daily_logs() outside live mode is the intended final step, so logs aren’t missed in any future expansions or edge cases.

Tools/Logger.py (4)

11-42: Good encapsulation in LogMessage
The use of hashes for grouping and numeric extraction is well-designed, promoting flexibility and swift grouping in the logger. This class improves readability and modularity.
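One plausible way to hash a message so that lines differing only in their numeric values fall into the same group is to hash a number-stripped template. This is an illustrative sketch, not the actual LogMessage code:

```python
import hashlib
import re

NUM_RE = re.compile(r"-?\d+(?:\.\d+)?")

def group_key(message: str) -> str:
    """Replace numbers with a placeholder, then hash the resulting template."""
    template = NUM_RE.sub("{}", message)
    return hashlib.md5(template.encode()).hexdigest()

# Messages differing only by a number share the same key...
assert group_key("filled 10 shares") == group_key("filled 250 shares")
# ...while structurally different messages do not.
assert group_key("filled 10 shares") != group_key("cancelled 10 shares")
```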


185-191: Initialize logger storage conditionally
Initializing logger_storage if it doesn't exist ensures data is consolidated in a single place. This is a good approach, but be mindful of concurrent usage across multiple classes if concurrency is later introduced.


225-275: Efficient daily log processing
This method neatly segregates the logs and clears old entries. Confirm that the logic for skipping processing when current_day == self.last_summary_day aligns with your daily reset expectations, especially if running across midnight boundary conditions.
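The skip-on-same-day guard mentioned here can be sketched like this; the class name is hypothetical, and `last_summary_day` mirrors the attribute referenced in the comment:

```python
from datetime import date

class DailySummarizer:
    """Emit a daily summary at most once per calendar day."""

    def __init__(self):
        self.last_summary_day: date | None = None

    def process(self, current_day: date) -> bool:
        """Return True if a summary was produced for current_day."""
        if current_day == self.last_summary_day:
            return False  # already summarized this day; skip
        self.last_summary_day = current_day
        return True

s = DailySummarizer()
assert s.process(date(2025, 3, 21)) is True   # first call today: summarize
assert s.process(date(2025, 3, 21)) is False  # same day again: skipped
assert s.process(date(2025, 3, 22)) is True   # midnight crossed: summarize again
```

Comparing whole `date` objects rather than, say, the day-of-month avoids the month-boundary edge case the review comment hints at.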


276-280: Immediate logging output
Formatting logs immediately for production or live mode can be invaluable for monitoring. The approach is straightforward and consistent with the rest of the logger design.


codecov bot commented Mar 21, 2025

Codecov Report

Attention: Patch coverage is 82.01058% with 34 lines in your changes missing coverage. Please review.

Project coverage is 72.04%. Comparing base (2c53c7c) to head (4e95730).
Report is 4 commits behind head on main.

✅ All tests successful. No failed tests found.

File with missing lines: Tools/Logger.py (patch coverage 82.01%, 34 lines missing ⚠️)
Additional details and impacted files
@@            Coverage Diff             @@
##             main      #21      +/-   ##
==========================================
+ Coverage   71.33%   72.04%   +0.71%     
==========================================
  Files          30       30              
  Lines        2693     2851     +158     
==========================================
+ Hits         1921     2054     +133     
- Misses        772      797      +25     
Flag unittests: coverage 72.04% <82.01%> (+0.71%) ⬆️

Flags with carried forward coverage won't be shown.

coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Nitpick comments (2)
README.md (1)

6-21: Clarify and Standardize Installation Instructions.
The revised setup steps clearly guide users to install PDM and adjust subsequent workflow steps accordingly. Consider ensuring consistent capitalization (e.g., use "QuantConnect" instead of "quantconnect") and punctuation throughout the instructions. Also, verify that the renumbering of steps (from 1 to 8) reads naturally for users.

.github/workflows/tests.yml (1)

31-32: Use of PDM for Installing Dependencies.
Switching to pdm install along with listing the installed packages via pdm list helps with debugging. If the debug output isn’t intended for production, consider documenting its purpose or removing it.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between acbe735 and a6c0a1b.

⛔ Files ignored due to path filters (2)
  • Pipfile.lock is excluded by !**/*.lock
  • pdm.lock is excluded by !**/*.lock
📒 Files selected for processing (7)
  • .github/workflows/tests.yml (2 hunks)
  • .pdm-python (1 hunks)
  • Pipfile (0 hunks)
  • README.md (1 hunks)
  • pyproject.toml (1 hunks)
  • requirements.txt (0 hunks)
  • run_tests.sh (1 hunks)
💤 Files with no reviewable changes (2)
  • requirements.txt
  • Pipfile
✅ Files skipped from review due to trivial changes (2)
  • .pdm-python
  • pyproject.toml
🔇 Additional comments (6)
.github/workflows/tests.yml (3)

24-28: Transition to PDM for Dependency Management.
The addition of the "Install PDM" step (using pip install pdm) properly replaces the old pipenv setup. This change is clear and necessary for standardizing dependency management.


36-38: Running Tests with PDM.
The updated command (CI=true pdm run ./run_tests.sh) correctly transitions the test execution from pipenv to PDM. Verify that the CI environment variable is properly propagated in all CI environments to ensure consistent behavior.


56-59: Updating Artifact Upload Action.
Updating to actions/upload-artifact@v4 is a good move to leverage the latest features and improvements. Ensure that any configuration changes required by version 4 are addressed or documented as needed.

run_tests.sh (3)

36-40: Migrating Test Execution to PDM Run Commands.
The CI condition now appropriately uses pdm run mamba Tests/specs --enable-coverage --format=junit > junit.xml, while the non-CI branch runs without the XML output. This migration from pipenv to PDM appears to be correctly implemented. Confirm that the generated test artifacts (like junit.xml) are properly integrated into the CI reporting pipeline.


43-45: HTML Coverage Report Generation with PDM.
The command to generate HTML coverage reports using pdm run coverage html along with the specified include and omit options is correctly configured. Ensure that the --include and --omit patterns accurately reflect the desired files and directories.


48-52: Conditional XML Coverage Report Generation in CI.
Generating the XML coverage report only when $CI is set to true is an efficient approach. Verify that the XML output conforms to the requirements of your coverage analysis tools (e.g., Codecov).

@Chocksy force-pushed the feat/improved_logger branch from a6c0a1b to 4e95730 on March 21, 2025 17:37
@Chocksy merged commit f3b1726 into main on Mar 21, 2025, with 3 of 4 checks passed
@Chocksy deleted the feat/improved_logger branch March 21, 2025 17:39